# Semantic Search Optimization
**Mxbai Rerank Base V2 Seq** (michaelfeil) · Apache-2.0 · 528 downloads · 7 likes
A Transformer-based multilingual text ranking model that supports relevance ranking in 15 languages.
Tags: Text Embedding, Transformers, Multilingual
**Nomic Embed Text V2 Moe Msmarco Bpr** (BlackBeenie) · 41 downloads · 1 like
A sentence-transformers model fine-tuned from nomic-ai/nomic-embed-text-v2-moe that maps text to a 768-dimensional dense vector space for tasks such as semantic similarity.
Tags: Text Embedding
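The "Bpr" suffix suggests Binary Passage Retrieval, in which dense embeddings are binarized into compact bit codes and compared by Hamming distance. A minimal pure-Python sketch of that idea (the vectors below are invented toy values, not output from this model):

```python
# Toy sketch of Binary Passage Retrieval (BPR)-style matching:
# binarize dense embeddings by sign, then rank by Hamming distance.
# A real system would binarize the model's 768-dimensional embeddings.

def binarize(vec):
    """Map each float dimension to a bit: 1 if positive, else 0."""
    return [1 if x > 0 else 0 for x in vec]

def hamming(a, b):
    """Number of differing bits between two binary codes."""
    return sum(x != y for x, y in zip(a, b))

query = binarize([0.3, -0.1, 0.8, -0.5])
passages = {
    "p1": binarize([0.2, -0.4, 0.7, -0.9]),   # same sign pattern as query
    "p2": binarize([-0.6, 0.5, -0.1, 0.3]),   # opposite pattern
}

ranked = sorted(passages, key=lambda p: hamming(query, passages[p]))
print(ranked[0])  # "p1" is the closer passage (Hamming distance 0)
```

The appeal of this scheme is that a 768-bit code is 32x smaller than 768 float32 dimensions, and Hamming distance is cheap to compute at scale.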
**Cde Small V1** (OrcaDB) · 90.62k downloads · 4 likes
cde-small-v1 is a small transformer-based sentence embedding model that performs well across text classification, clustering, and retrieval tasks.
Tags: Text Embedding, Transformers
**BGE M3 Ko** (dragonkue) · Apache-2.0 · 29.78k downloads · 44 likes
A Korean-English bilingual sentence embedding model built on BAAI/bge-m3, supporting semantic similarity, information retrieval, and related tasks.
Tags: Text Embedding, Multilingual
**Bge M3 Gguf** (lm-kit) · MIT · 2,885 downloads · 10 likes
GGUF-quantized version of the bge-m3 embedding model for efficient text embedding.
Tags: Text Embedding
**All Minilm L6 V2 With Attentions** (Qdrant) · Apache-2.0 · 450.93k downloads · 10 likes
An ONNX port of sentence-transformers/all-MiniLM-L6-v2, modified to return attention weights; designed specifically for BM42 search scenarios.
Tags: Text Embedding, Transformers, English
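BM42, proposed by Qdrant, estimates sparse term importance from the attention a token receives from the [CLS] token, combined with a BM25-style IDF. A rough pure-Python sketch under those assumptions; the attention weights and document frequencies below are invented for illustration:

```python
import math

# Rough sketch of a BM42-style term score: the importance of each token
# is its [CLS] attention weight multiplied by its IDF. All numbers here
# are toy values, not real model output.

def idf(term, n_docs, doc_freq):
    """BM25-style inverse document frequency."""
    return math.log((n_docs - doc_freq[term] + 0.5) / (doc_freq[term] + 0.5) + 1)

cls_attention = {"fast": 0.05, "semantic": 0.40, "search": 0.35}  # toy values
doc_freq = {"fast": 500, "semantic": 40, "search": 120}
n_docs = 1000

scores = {t: cls_attention[t] * idf(t, n_docs, doc_freq) for t in cls_attention}
best = max(scores, key=scores.get)
print(best)  # "semantic": high attention and a rare term
```

This is why the ONNX port above exposes attention weights: they stand in for term frequency as the importance signal in the sparse score.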
**Mxbai Embed Large V1 Gguf** (ChristianAzinn) · Apache-2.0 · 646 downloads · 4 likes
mxbai-embed-large-v1 is a BERT-large-based sentence embedding model trained with the AnglE loss. It supports English text embedding and ships in multiple quantized versions.
Tags: Text Embedding, English
**Bge Large Zh V1.5 Gguf** (CompendiumLabs) · MIT · 1,213 downloads · 12 likes
BAAI/bge-large-zh-v1.5 embedding model in GGUF format, in quantized and non-quantized variants, optimized for llama.cpp: significant CPU speedup with minimal precision loss.
Tags: Text Embedding
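The "minimal precision loss" claim for quantized embeddings can be illustrated with a naive symmetric 8-bit round trip. This is a simplification for intuition only; llama.cpp's actual GGUF formats (Q8_0 and friends) quantize block-wise with a scale per block:

```python
# Naive symmetric 8-bit quantization of an embedding vector, to show
# why quantized embeddings lose little precision: the rounding error
# per dimension is bounded by half a quantization step.
# Simplified for illustration; not llama.cpp's actual block format.

def quantize_q8(vec):
    scale = max(abs(x) for x in vec) / 127 or 1.0
    q = [round(x / scale) for x in vec]  # ints in [-127, 127]
    return q, scale

def dequantize_q8(q, scale):
    return [x * scale for x in q]

vec = [0.123, -0.456, 0.789, -0.012]
q, scale = quantize_q8(vec)
restored = dequantize_q8(q, scale)

max_err = max(abs(a - b) for a, b in zip(vec, restored))
print(max_err < scale)  # True: reconstruction error stays tiny
```

Storing one byte per dimension instead of four is what makes GGUF embedding models attractive on CPU.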
**Jina Embeddings V2 Base En** (Cohee) · 1,584 downloads · 2 likes
An efficient English text embedding model that converts text into high-dimensional vector representations for a range of natural language processing tasks.
Tags: Text Embedding, Transformers
**River Retriver 416data Testing** (li-ping) · 15 downloads · 0 likes
A sentence-transformers embedding model that maps text to a 768-dimensional vector space for semantic search and text similarity.
Tags: Text Embedding
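Many of the models in this list compare embeddings by cosine similarity, the standard metric for sentence-transformers models. A self-contained sketch, with toy 4-dimensional vectors standing in for real 768-dimensional embeddings:

```python
import math

# Cosine similarity between two embedding vectors: dot product divided
# by the product of their magnitudes. Ranges from -1 to 1; parallel
# vectors score 1.0 regardless of magnitude.

def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return num / den

s = cosine([1.0, 2.0, 3.0, 0.0], [2.0, 4.0, 6.0, 0.0])
print(round(s, 6))  # 1.0, since the vectors are parallel
```

Because cosine ignores magnitude, models trained for it usually normalize their output vectors, after which cosine and dot product coincide.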
**Lodestone Base 4096 V1** (Hum-Works) · Apache-2.0 · 132 downloads · 11 likes
A sentence-transformers model developed by Hum that embeds texts up to 4096 tokens long, suitable for semantic search and clustering.
Tags: Text Embedding, English
**E5 Small V2 Onnx** (nixiesearch) · Apache-2.0 · 221 downloads · 0 likes
A sentence-transformer model that maps text to a dense vector space for semantic search and clustering.
Tags: Text Embedding, English
**Msmarco Bert Base Dot V5 Fine Tuned AI** (Adel-Elwan) · 18 downloads · 0 likes
A BERT-based semantic search model optimized for information retrieval systems; maps text to a 768-dimensional vector space.
Tags: Text Embedding, Transformers, English
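The "Dot" in this model family's name indicates training for dot-product similarity: passages are ranked by the raw inner product of query and passage vectors, without normalization. A minimal sketch with invented 4-dimensional vectors in place of the model's 768-dimensional ones:

```python
def dot(a, b):
    """Inner product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

# Toy embeddings standing in for real model outputs.
query = [0.9, 0.1, 0.0, 0.3]
passages = {
    "doc_a": [0.8, 0.2, 0.1, 0.4],
    "doc_b": [-0.5, 0.9, 0.2, 0.0],
}

# Rank passages by descending dot product with the query.
ranked = sorted(passages, key=lambda p: dot(query, passages[p]), reverse=True)
print(ranked)  # ['doc_a', 'doc_b']
```

Unlike cosine similarity, dot product rewards longer vectors, which such models can exploit to encode passage importance in the embedding magnitude.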
**Multi Qa Mpnet Base Dot V1** (model-embeddings) · 772 downloads · 1 like
A sentence-transformers semantic search model that maps sentences and paragraphs into a 768-dimensional dense vector space.
Tags: Text Embedding
**Glucose Base Ja** (pkshatech) · Apache-2.0 · 70.71k downloads · 32 likes
GLuCoSE is a LUKE-based Japanese text embedding model for sentence similarity and semantic search.
Tags: Text Embedding, Transformers, Japanese
**Hindi Sentence Similarity Sbert** (l3cube-pune) · 655 downloads · 7 likes
A Hindi sentence similarity model fine-tuned on the STS dataset; computes semantic similarity between Hindi sentences.
Tags: Text Embedding, Transformers, Other
**Sbert All MiniLM L6 With Pooler** (optimum) · Apache-2.0 · 3,867 downloads · 6 likes
An ONNX sentence-transformers model that maps text to a 384-dimensional vector space for semantic search and clustering.
Tags: Text Embedding, English
**Sbert All MiniLM L12 With Pooler** (vamsibanda) · Apache-2.0 · 31 downloads · 0 likes
An ONNX sentence-transformers model that maps sentences and paragraphs into a 384-dimensional dense vector space for clustering and semantic search.
Tags: Text Embedding, Transformers, English
**Msmarco Distilbert Base V4 Feature Extraction Pipeline** (questgen) · Apache-2.0 · 36 downloads · 0 likes
A DistilBERT-based sentence transformer designed for feature extraction and sentence similarity.
Tags: Text Embedding, Transformers
**English Phrases Bible** (iamholmes) · Apache-2.0 · 28 downloads · 0 likes
A sentence embedding model based on DistilBERT TAS-B, optimized for semantic search; maps text to a 768-dimensional vector space.
Tags: Text Embedding, Transformers
**Dense Encoder Msmarco Distilbert Word2vec256k MLM 785k Emb Updated** (vocab-transformers) · 33 downloads · 0 likes
A DistilBERT model with a word2vec-initialized 256k vocabulary, trained on MS MARCO and optimized for sentence similarity.
Tags: Text Embedding, Transformers
**Dense Encoder Msmarco Distilbert Word2vec256k Emb Updated** (vocab-transformers) · 31 downloads · 0 likes
A DistilBERT-based sentence embedding model with a word2vec-initialized 256k vocabulary, trained on MS MARCO for sentence similarity and semantic search.
Tags: Text Embedding, Transformers
**Dense Encoder Msmarco Distilbert Word2vec256k MLM 445k Emb Updated** (vocab-transformers) · 29 downloads · 0 likes
A sentence embedding model trained on MS MARCO, using a word2vec-initialized 256k vocabulary and the DistilBERT architecture, for semantic search and sentence similarity.
Tags: Text Embedding, Transformers
**All Datasets V3 Mpnet Base** (flax-sentence-embeddings) · Apache-2.0 · 3,472 downloads · 13 likes
An MPNet-based sentence embedding model that maps text to a 768-dimensional vector space for semantic search and sentence similarity.
Tags: Text Embedding, English
**SBERT Large Nli V2** (Muennighoff) · 43 downloads · 1 like
SBERT-large-nli-v2 is a large BERT-based sentence transformer designed for sentence similarity and feature extraction.
Tags: Text Embedding, Transformers
**SGPT 1.3B Weightedmean Nli Bitfit** (Muennighoff) · 206 downloads · 0 likes
SGPT is a GPT-based sentence embedding model for semantic search; it produces sentence representations via weighted mean pooling.
Tags: Text Embedding
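SGPT's "weighted mean" pooling gives later tokens higher weight (position weights 1, 2, ..., n, normalized), on the reasoning that in a causal model later hidden states have attended to more context. A pure-Python sketch with invented 3-dimensional token vectors:

```python
# Position-weighted mean pooling in the style of SGPT: token i
# (1-indexed) gets weight i / (1 + 2 + ... + n), so later tokens
# dominate the pooled embedding. Hidden states below are toy values.

def weighted_mean_pool(hidden_states):
    n = len(hidden_states)
    total = n * (n + 1) / 2                      # sum 1..n
    weights = [(i + 1) / total for i in range(n)]
    dim = len(hidden_states[0])
    return [sum(w * h[d] for w, h in zip(weights, hidden_states))
            for d in range(dim)]

tokens = [[1.0, 0.0, 0.0],
          [0.0, 1.0, 0.0],
          [0.0, 0.0, 1.0]]
embedding = weighted_mean_pool(tokens)
print(embedding)  # weights 1/6, 2/6, 3/6 -> [0.1666..., 0.3333..., 0.5]
```

Compare this with the plain mean pooling typical of BERT-style encoders, where every token weight would be 1/n.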